Empirical Analysis of the Hessian of Over-Parametrized Neural Networks

Author

  • Levent Sagun
Abstract

We study the properties of common loss surfaces through their Hessian matrix. In particular, in the context of deep learning, we empirically show that the spectrum of the Hessian is composed of two parts: (1) the bulk, centered near zero, and (2) outliers away from the bulk. We present numerical evidence and mathematical justifications for the following conjectures laid out by Sagun et al. (2016): fixing the data and increasing the number of parameters merely scales the bulk of the spectrum; fixing the dimension and changing the data (for instance, adding more clusters or making the data less separable) only affects the outliers. We believe that our observations have striking implications for non-convex optimization in high dimensions. First, the flatness of such landscapes (which can be measured by the singularity of the Hessian) implies that classical notions of basins of attraction may be quite misleading, and that the discussion of wide versus narrow basins may need a new perspective built around over-parametrization and redundancy, which can create large connected components at the bottom of the landscape. Second, the dependence of a small number of large eigenvalues on the data distribution can be linked to the spectrum of the covariance matrix of gradients of model outputs. With this in mind, we may reevaluate the connections within the data-architecture-algorithm framework of a model, hoping to shed light on the geometry of high-dimensional, non-convex spaces in modern applications. In particular, we present a case that links the two observations: small-batch and large-batch gradient descent appear to converge to different basins of attraction, but we show that they are in fact connected through a flat region and therefore belong to the same basin.
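As a rough illustration of the bulk-plus-outliers picture (a minimal sketch under assumed settings, not the paper's experimental setup): the code below builds a small over-parametrized network on synthetic clustered data, trains it briefly, and computes the full Hessian eigenspectrum. The hidden width, cluster count, and training schedule are illustrative assumptions; with so few parameters the dense Hessian is cheap to form.

```python
# Sketch: Hessian eigenspectrum of a tiny over-parametrized MLP on
# synthetic clustered data. Expect a bulk of near-zero eigenvalues and
# a few large outliers (on the order of the number of classes).
import math
import torch

torch.manual_seed(0)

# Synthetic data: k Gaussian clusters in the plane, one label per cluster.
k, n_per, d = 3, 50, 2
centers = 4.0 * torch.randn(k, d)
X = torch.cat([centers[i] + torch.randn(n_per, d) for i in range(k)])
y = torch.arange(k).repeat_interleave(n_per)

# One-hidden-layer net parametrized by a single flat vector, so that
# torch.autograd.functional.hessian can differentiate the loss twice.
h = 10  # hidden width (assumed); keeps the total parameter count small
shapes = [(h, d), (h,), (k, h), (k,)]
n_params = sum(math.prod(s) for s in shapes)

def unflatten(theta):
    parts, i = [], 0
    for s in shapes:
        n = math.prod(s)
        parts.append(theta[i:i + n].reshape(s))
        i += n
    return parts

def loss_fn(theta):
    W1, b1, W2, b2 = unflatten(theta)
    logits = torch.tanh(X @ W1.T + b1) @ W2.T + b2
    return torch.nn.functional.cross_entropy(logits, y)

# A short gradient-descent run; the bulk/outlier split is clearest near
# the bottom of the landscape, not at a random initialization.
theta = (0.5 * torch.randn(n_params)).requires_grad_(True)
opt = torch.optim.SGD([theta], lr=0.1)
for _ in range(500):
    opt.zero_grad()
    loss_fn(theta).backward()
    opt.step()

H = torch.autograd.functional.hessian(loss_fn, theta.detach())
eigs = torch.linalg.eigvalsh(H)
print("largest eigenvalues:", eigs[-5:])
print("fraction with |eig| < 1e-3:", (eigs.abs() < 1e-3).float().mean().item())
```

In line with the conjectures above, widening the hidden layer mostly adds near-zero eigenvalues to the bulk, while changing the data (more clusters, or clusters pushed closer together) mostly moves the outliers.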

Similar articles

Empirical Risk Landscape Analysis for Understanding Deep Neural Networks

This work aims to provide a comprehensive landscape analysis of the empirical risk in deep neural networks (DNNs), covering the convergence behavior of its gradient, its stationary points, and the empirical risk itself toward their corresponding population counterparts, which reveals how various network parameters determine convergence performance. In particular, for an l-layer linear neural network ...

Simulating Action Dynamics with Neural Process Networks (ICLR 2018)

Understanding procedural language requires anticipating the causal effects of actions, even when they are not explicitly stated. In this work, we introduce Neural Process Networks to understand procedural text through (neural) simulation of action dynamics. Our model complements existing memory architectures with dynamic entity tracking by explicitly modeling actions as state transformers. The ...

Synthesis of electron-poor N-vinylimidazole derivatives catalyzed by silica nanoparticles under solvent-free conditions

Protonation of the highly reactive 1:1 intermediates, produced in the reaction between triphenylphosphine and acetylenic esters, by NH-acids such as azathioprine, imidazole, or theophylline leads to the formation of vinyltriphenylphosphonium salts, which undergo a Michael addition reaction with a conjugate base to produce phosphorus ylides. Silica nanoparticles (silica NPs) were prepared by therm...

Quantized Back-Propagation: Training Binarized Neural Networks with Quantized Gradients

Binarized Neural Networks (BNNs) have been shown to be effective in improving network efficiency during the inference phase, after the network has been trained. However, BNNs only binarize the model parameters and activations during propagation. We show there is no inherent difficulty in training BNNs using "Quantized Back-Propagation" (QBP), in which we also quantize the error gradients and i...
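The core mechanism here, quantizing the error gradients themselves, can be illustrated with a generic stochastic-rounding quantizer (a hedged sketch of the idea, not the paper's QBP scheme; `n_bits` and the uniform levels are assumptions):

```python
import torch

def stochastic_quantize(g, n_bits=4):
    """Quantize g to 2**n_bits uniform levels in [-max|g|, max|g|] using
    stochastic rounding, which keeps the quantizer unbiased: E[q(g)] = g."""
    levels = 2 ** n_bits - 1
    scale = g.abs().max().clamp(min=1e-12)      # avoid division by zero
    x = (g / scale + 1.0) / 2.0 * levels        # map to [0, levels]
    lo = x.floor()
    q = lo + (torch.rand_like(x) < (x - lo)).float()  # round up w.p. frac(x)
    return (q / levels * 2.0 - 1.0) * scale     # map back to original range

# Hypothetical usage: quantize each parameter gradient before the update.
# for p in model.parameters():
#     p.grad = stochastic_quantize(p.grad)
# optimizer.step()
```

Stochastic rounding matters because deterministic rounding would zero out small gradients and bias training; in expectation over the rounding noise, the full-precision gradient is recovered.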

Journal:

Volume   Issue

Pages  -

Publication year: 2018